"Everything I can't have in this world is because of that thing down there. If you wanna see who I am, that's the last place you should look."
This is a continuation of my previous post, 'Reality is (Probably) Illusionary'. In the same spirit, I will continue to argue that we place too much emphasis on the singular reality that reveals itself versus the many that lurk in the obscure realm of non-existence. Here I discuss the distinction between the results of randomness and the processes responsible for generating that variability in the first place.
Consider the financial price chart below. How large do you think the timesteps are: is this measured in seconds, days, or perhaps years? It is impossible to know, because such time series are visually fractal. They exhibit self-similarity across different scales, and yet the generating processes are not the same. One can apply a successful trend-following algorithm at certain timescales but not at others (at some timescales markets mean-revert). Generally speaking, focusing too much on the visual can be extremely misleading when it comes to understanding the actual mechanics of a process. Retail forex traders often fall prey to this nuance: they are encouraged to trade at faster timescales without realising that the signal-to-noise ratio decreases as the time series is magnified.
```python
import numpy as np
import matplotlib.pyplot as plt

# Set random seed for reproducibility
np.random.seed(42)

# Number of time steps
num_steps = 1000

# Generate random increments (random walk)
increments = np.random.normal(0, 1, num_steps)

# Generate prices from increments
prices = 100 + np.cumsum(increments)

# Plotting
plt.figure(figsize=(10, 6))
plt.plot(prices, color='blue', lw=1)
plt.xlabel('Time Steps')
plt.ylabel('Price')
plt.grid(True)
plt.show()
```
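To make the fractal claim visible, here is a small sketch (my own extension of the walk above, not part of the original chart): generate a longer walk and plot the full series beside a 1% sub-window. Without the axis labels, it is genuinely hard to say which of the two is the "zoomed in" chart.

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(42)
prices = 100 + np.cumsum(np.random.normal(0, 1, 100000))

# Plot the full series next to a 1% sub-window of the same walk:
# the two panels are visually similar despite the 100x difference in scale.
fig, axes = plt.subplots(1, 2, figsize=(12, 4))
axes[0].plot(prices, lw=0.5, color='blue')
axes[0].set_title('Full series (100,000 steps)')
axes[1].plot(prices[50000:51000], lw=0.5, color='blue')
axes[1].set_title('1% sub-window (1,000 steps)')
plt.show()
```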
The previous example may be extended to other aspects of life: instead of updating one's sense of self based on the things that have happened, perhaps one should also consider personality itself to be drawn from random distributions. I may be an upstanding citizen who pays his taxes in this reality, but what if, in another iteration, an unfortunate series of events compels me to become a common criminal? Or better yet, what if the randomised genetic hand I have been dealt is so disadvantageous that I am coerced into operating on another playing field just to survive? The code below draws from simple distributions to assign innate characteristics to five different humans. What comes to mind when you envisage each individual? Do certain biases or judgements immediately appear? If so, you are likely human, since your heuristics force you to, even though you know damned well that the result-generating process is completely random.
It might sound like I am being pedantic, or even insensitive, by randomly allocating character traits, but it is truly quite the opposite. If anything, the point I am trying to make is that what we see is oftentimes a shallow illumination of what a person represents. Instead of looking at the effects of randomness, the aperture we should focus on is the effect in spite of randomness. In other words, we should commend the poker player who consistently demonstrates skill and tact even when she does not receive a winning hand. The resultant winner, conversely, may be of little interest once we condition on the blessing (or curse) of stochasticity. Unlike financial markets, people tend to demonstrate higher signal-to-noise ratios upon magnification.
```python
import numpy as np
import random

def generate_height():
    # Assume height is normally distributed: mean 170 cm, std dev 10 cm
    return np.random.normal(loc=170, scale=10)

def generate_gender():
    # 50/50 chance of being male or female
    return random.choice(['Male', 'Female'])

def generate_race():
    # Rough proportions of different races in the world
    races = ['Asian', 'White', 'Black', 'Hispanic', 'Other']
    proportions = [0.60, 0.16, 0.12, 0.09, 0.03]
    return np.random.choice(races, p=proportions)

def generate_iq():
    # Assume IQ follows a normal distribution: mean 100, std dev 15
    return np.random.normal(loc=100, scale=15)

def generate_social_class():
    # Define social classes and their proportions
    social_classes = ['Upper Class', 'Middle Class', 'Working Class']
    proportions = [0.1, 0.6, 0.3]
    return np.random.choice(social_classes, p=proportions)

# Generate random data for five individuals
num_individuals = 5
data = []
for _ in range(num_individuals):
    height = generate_height()
    gender = generate_gender()
    race = generate_race()
    iq = generate_iq()
    social_class = generate_social_class()
    data.append({'Height': height, 'Gender': gender, 'Race': race,
                 'IQ': iq, 'Social Class': social_class})

# Print the generated data
for i, person in enumerate(data):
    print(f"Person {i+1}: Height: {person['Height']:.2f} cm, "
          f"Gender: {person['Gender']}, Race: {person['Race']}, "
          f"IQ: {person['IQ']:.2f}, Social Class: {person['Social Class']}")
```
```
Person 1: Height: 177.72 cm, Gender: Female, Race: Asian, IQ: 57.27, Social Class: Middle Class
Person 2: Height: 166.38 cm, Gender: Female, Race: Hispanic, IQ: 83.20, Social Class: Middle Class
Person 3: Height: 157.05 cm, Gender: Female, Race: Black, IQ: 117.41, Social Class: Middle Class
Person 4: Height: 169.53 cm, Gender: Female, Race: Asian, IQ: 107.16, Social Class: Upper Class
Person 5: Height: 170.77 cm, Gender: Male, Race: Asian, IQ: 80.76, Social Class: Working Class
```
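The poker point can be made concrete with a toy simulation (my own sketch, assuming skill is a small positive edge of 0.05 units per hand on top of unit-variance luck). Over a short session, luck dominates and the unskilled player is often ahead; over many hands, the edge almost always shows through.

```python
import numpy as np

np.random.seed(0)

def play(num_hands, edge):
    # Each hand's result is mostly luck (unit-variance noise)
    # plus a small per-hand skill edge; return cumulative winnings.
    return np.cumsum(np.random.normal(edge, 1, num_hands))

# Over 20 hands, luck dominates: the unskilled player frequently finishes ahead.
short_runs = sum(play(20, 0.0)[-1] > play(20, 0.05)[-1] for _ in range(1000))
# Over 10,000 hands, the 0.05 edge almost always separates the two.
long_runs = sum(play(10000, 0.0)[-1] > play(10000, 0.05)[-1] for _ in range(100))

print(f"Unskilled player ahead after 20 hands: {short_runs / 1000:.0%} of runs")
print(f"Unskilled player ahead after 10,000 hands: {long_runs / 100:.0%} of runs")
```

This is the "magnification" claim in miniature: the longer we watch, the more the process (skill) dominates the result (cards).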
These days it seems like every other new series that crops up on Netflix is a true crime documentary. I have always found these fascinating; the only compelling reason to give serial murderers and psychopaths a platform for fame is for them to act as cautionary tales, assuming that we accurately represent the likelihood of such occurrences. Given how desperate Netflix productions seem to be to rehash stories from decades past, I believe this to be highly unlikely. In a similar vein, many news platforms rely on sensationalist headlines and stories that hardly constitute news: they sell the tail end of the distribution, distorting subscribers' probabilistic view of real-life occurrences. It would be more informative to advise audiences of all the things that did not happen on a particular day, to set proportions straight, but I suppose that would not make for a very captivating news piece. When we do not set our sights on the distributive properties of the process, we can easily fall prey to wayward assumptions about what the world really looks like. Consider the example below, where a news headline is generated from the maximal earnings of a crypto investor: not very indicative when we look at the broader picture...
```python
import numpy as np
import matplotlib.pyplot as plt

def simulate_distribution(num_days, mean, std_dev):
    return np.random.normal(mean, std_dev, size=num_days)

def generate_news_report(earnings, mean):
    max_earnings_day = np.argmax(earnings)
    max_earnings = earnings[max_earnings_day]
    print(f"MAJOR NEWS: Today Crypto Chad made a profit of ${round(max_earnings)}!")
    print(f"minor news: The average crypto profit is ${mean}.")

def plot_series(earnings):
    plt.figure(figsize=(10, 6))
    plt.plot(earnings, color='blue')  # plot the simulated earnings series
    plt.xlabel('n (crypto-bro)')
    plt.ylabel('Profits (USD)')
    plt.grid(True)
    plt.show()

num_days = 100
mean = 0
std_dev = 100000

earnings = simulate_distribution(num_days, mean, std_dev)
plot_series(earnings)
generate_news_report(earnings, mean)
```
```
MAJOR NEWS: Today Crypto Chad made a profit of $294460!
minor news: The average crypto profit is $0.
```
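To put a number on how unrepresentative such a headline is, here is a quick sketch (my own extension, using the same zero-mean, $100k-std-dev model over many more "days"): days worthy of a "$250k+ profit!" headline are well under 1% of the distribution, while the typical day sits near zero.

```python
import numpy as np

np.random.seed(1)
# Same earnings model as above, but sampled over 100,000 "days"
earnings = np.random.normal(0, 100000, size=100000)

# How often does a headline-sized day actually occur?
threshold = 250000
frequency = np.mean(earnings > threshold)
print(f"Days with a ${threshold:,}+ profit: {frequency:.2%}")
print(f"Median day: ${np.median(earnings):,.0f}")
```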
What do you know to be true? Like really, really know? If you strip the randomness away from someone, what's left?
Let's start with a simplistic model of a human life in the animation below. You are represented by a red ball bouncing within some pre-defined boundary. Unfortunately, as a human, your existence is finite, and so your circular boundary is slowly shrinking. The truth slowly starts to sink in when you realise the non-ergodic nature of your playpen. Untold realities, encounters and skills are slowly shed until one final path appears. You have reached your long-awaited terminus: a single point and, finally, no more unknowns, branches and sub-branches. This is it. You were meant to arrive here. In fact, no matter how many times the simulation is run, you were always going to end up here. At last, you have become you. Except, what if this wasn't the best you? You cast your mind back to travelling with your parents as a kid; you wish you could tell them that you love them and that you were grateful they put up with you for all those years. You think about your best friends from university whom you stopped seeing, certainly over something petty. You think about the moments you were the most kind, but also the moments you were not yourself. You would do so much differently; you would be better. You would have spent less time feeling sorry for yourself and more time realising how fortunate you once were to have such a large circle to bounce off from. The boundaries slowly close in on you as one final thought presents itself: the next time around, you will focus less on what is and more on what can be.
```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation
from IPython.display import HTML
import warnings

warnings.simplefilter(action='ignore', category=FutureWarning)

# Parameters
initial_radius_large_circle = 10
radius_small_ball = 0.5
speed = 0.5
interval = 15

# Create figure and axis
fig, ax = plt.subplots()
ax.set_aspect('equal')

# Function to initialize the plot
def init():
    ax.set_xlim(-initial_radius_large_circle, initial_radius_large_circle)
    ax.set_ylim(-initial_radius_large_circle, initial_radius_large_circle)

    # Create the larger circle (the shrinking boundary)
    global larger_circle
    larger_circle = plt.Circle((0, 0), initial_radius_large_circle, color='b', fill=False)
    ax.add_artist(larger_circle)

    # Create the smaller ball
    global small_ball
    small_ball = plt.Circle((0, 0), radius_small_ball, color='r', zorder=10)
    ax.add_artist(small_ball)

    # Initialize position and velocity of the ball
    global x, y, dx, dy, current_radius_large_circle
    x = np.random.uniform(-initial_radius_large_circle + radius_small_ball,
                          initial_radius_large_circle - radius_small_ball)
    y = np.random.uniform(-initial_radius_large_circle + radius_small_ball,
                          initial_radius_large_circle - radius_small_ball)
    dx, dy = np.random.uniform(-speed, speed), np.random.uniform(-speed, speed)
    current_radius_large_circle = initial_radius_large_circle
    return larger_circle, small_ball

# Update function
def update(frame):
    global x, y, dx, dy, current_radius_large_circle

    # Update position
    x += dx
    y += dy

    # Check for collision with the boundary of the shrinking circle
    distance_to_center = np.sqrt(x**2 + y**2)
    if distance_to_center >= current_radius_large_circle - radius_small_ball:
        # Reflect the velocity about the boundary normal
        reflection_vector = np.array([x, y]) / distance_to_center
        dot_product = np.dot([dx, dy], reflection_vector)
        dx -= 2 * dot_product * reflection_vector[0]
        dy -= 2 * dot_product * reflection_vector[1]
        # Ensure the ball stays within the shrinking circle
        x = reflection_vector[0] * (current_radius_large_circle - radius_small_ball)
        y = reflection_vector[1] * (current_radius_large_circle - radius_small_ball)

    # Update ball position
    small_ball.set_center((x, y))

    # Shrink the larger circle roughly once per second,
    # never below the radius of the ball itself
    if frame % (1000 // interval) == 0:
        current_radius_large_circle = max(current_radius_large_circle - 1, radius_small_ball)
        larger_circle.set_radius(current_radius_large_circle)
    return larger_circle, small_ball

# Create animation
ani = FuncAnimation(fig, update, init_func=init, frames=range(1000), blit=True, interval=interval)

# Clear the plot before displaying the animation
plt.close()

# Convert the animation to JavaScript HTML and display it
HTML(ani.to_jshtml())
```
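The non-ergodicity the playpen gestures at can also be made quantitative with a standard toy example (my own addition, not part of the animation): a repeated bet that gains 50% on heads and loses 40% on tails. The ensemble expectation grows by 5% per round, yet the time-average growth per round is sqrt(1.5 × 0.6) − 1, roughly −5%, so the typical individual trajectory shrinks. What is true of the ensemble of lives is not true of the one life you actually get to live.

```python
import numpy as np

np.random.seed(42)
num_players, num_rounds = 100000, 50

# Each round: +50% on heads, -40% on tails, starting wealth of 1.
# Ensemble expectation grows 5% per round; the time average decays ~5% per round.
factors = np.where(np.random.rand(num_players, num_rounds) < 0.5, 1.5, 0.6)
wealth = np.prod(factors, axis=1)  # each player's final wealth

print(f"Ensemble average final wealth: {wealth.mean():.2f}")
print(f"Median final wealth: {np.median(wealth):.4f}")
print(f"Fraction of players who lost money: {np.mean(wealth < 1):.0%}")
```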